
    Properties and use of CMB power spectrum likelihoods

    Fast, robust methods for calculating likelihoods from CMB observations on small scales generally rely on approximations based on a set of power spectrum estimators and their covariances. We investigate the optimality of these approximations, how accurate the covariance needs to be, and how to estimate the covariance from simulations. For a simple case with azimuthal symmetry we compare the optimality of hybrid pseudo-$C_\ell$ CMB power spectrum estimators with the exact result, indicating that the loss of information is not negligible, but neither is it enough to have a large effect on standard parameter constraints. We then discuss the number of samples required to estimate the covariance from simulations, with and without a good analytic approximation, and assess the use of shrinkage estimators. Finally, we discuss how to combine an approximate high-$\ell$ likelihood with a more exact low-$\ell$ harmonic-space likelihood as a practical method for accurate likelihood calculation on all scales. (Comment: 15 pages, 11 figures; updated to match version accepted by PR.)
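
    As a concrete illustration of the shrinkage idea assessed above, here is a minimal sketch of a linear shrinkage covariance estimate built from simulated band-power vectors: a convex blend of the empirical covariance with an analytic-style target. The names (n_sims, n_ell) and the fixed weight alpha are illustrative; the estimators studied in the paper choose the weight from the data.

```python
# A hedged sketch of linear shrinkage for a power-spectrum covariance,
# not the paper's exact estimator. alpha is fixed here for simplicity.
import numpy as np

def shrinkage_covariance(samples, target, alpha):
    """Blend the empirical covariance of simulated band powers
    with an analytic target: (1 - alpha) * S + alpha * T."""
    S = np.cov(samples, rowvar=False)
    return (1.0 - alpha) * S + alpha * target

rng = np.random.default_rng(0)
n_sims, n_ell = 100, 50                      # few simulations relative to dimension
true_cov = np.diag(np.linspace(1.0, 2.0, n_ell))
sims = rng.multivariate_normal(np.zeros(n_ell), true_cov, size=n_sims)
target = np.diag(np.diag(np.cov(sims, rowvar=False)))  # diagonal stand-in for an analytic model
C_hat = shrinkage_covariance(sims, target, alpha=0.3)
```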

    A simple scheme for allocating capital in a foreign exchange proprietary trading firm

    We present a model of capital allocation in a foreign exchange proprietary trading firm. The owner allocates capital to individual traders, who operate within strict risk limits. Traders specialize in individual currencies but are given discretion over their choice of trading rule. The owner provides the simple formula that determines position sizes – a formula that does not require estimation of the firm-level covariance matrix. We provide supporting empirical evidence of excess risk-adjusted returns to the firm-level portfolio, and we discuss a modification of the model in which the owner dictates the choice of trading rule.
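
    The paper's allocation formula is not reproduced in the abstract; purely as an illustration of what a covariance-free sizing rule can look like, the sketch below scales each trader's position by a per-trader risk budget and that trader's own volatility estimate. The rule and all names here are hypothetical, not the paper's formula.

```python
# Hypothetical covariance-free sizing rule, for illustration only;
# this is NOT the formula from the paper.
def position_size(capital, risk_budget, vol_estimate):
    """Standalone-risk sizing: firm capital times a per-trader risk budget,
    divided by the trader's own volatility estimate. No firm-level
    covariance matrix is needed."""
    return capital * risk_budget / vol_estimate

# Example: $10m firm capital, 1% risk budget, 8% annualized volatility.
print(position_size(capital=10_000_000, risk_budget=0.01, vol_estimate=0.08))
```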

    Detection of brain functional-connectivity difference in post-stroke patients using group-level covariance modeling

    Functional brain connectivity, as revealed through distant correlations in the signals measured by functional Magnetic Resonance Imaging (fMRI), is a promising source of biomarkers of brain pathologies. However, establishing and using diagnostic markers requires probabilistic inter-subject comparisons. Principled comparison of functional-connectivity structures is still a challenging issue. We give a new matrix-variate probabilistic model suitable for inter-subject comparison of functional connectivity matrices on the manifold of Symmetric Positive Definite (SPD) matrices. We show that this model leads to a new algorithm for principled comparison of connectivity coefficients between pairs of regions. We apply this model to compare post-stroke patients individually with a group of healthy controls. We find neurologically relevant connection differences and show that our model is more sensitive than the standard procedure. To the best of our knowledge, these results are the first report of functional-connectivity differences between a single patient and a group, and thus establish an important step toward using functional connectivity as a diagnostic tool.
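
    For readers who want the flavor of working on the SPD manifold, the sketch below uses the standard tangent-space construction (whiten by a base point, then take a matrix logarithm) and compares one patient's connectivity coefficients to the control distribution coefficient by coefficient. It is a simplification under assumed conventions, not the paper's matrix-variate model; the Euclidean mean base point in particular is a crude stand-in.

```python
# Tangent-space comparison of SPD connectivity matrices: a simplified
# sketch, not the paper's matrix-variate probabilistic model.
import numpy as np
from scipy.linalg import inv, logm, sqrtm

def tangent_embedding(cov, base):
    """Whiten a covariance by the base point, then take the matrix log."""
    w = inv(sqrtm(base))
    return np.real(logm(w @ cov @ w))

def coefficient_zscores(patient_cov, control_covs):
    """Per-coefficient z-scores of one patient against a control group,
    computed in the tangent space at a (crude) Euclidean mean base point."""
    base = np.mean(control_covs, axis=0)
    ctrl = np.array([tangent_embedding(c, base) for c in control_covs])
    pat = tangent_embedding(patient_cov, base)
    return (pat - ctrl.mean(axis=0)) / ctrl.std(axis=0)
```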

    An Evolutionary Optimization Approach to Risk Parity Portfolio Selection

    In this paper we present an evolutionary optimization approach to solve the risk parity portfolio selection problem. While there exist convex optimization approaches to solve this problem when long-only portfolios are considered, the optimization problem becomes non-trivial in the long-short case. To solve this problem, we propose a genetic algorithm as well as a local search heuristic. This algorithmic framework reliably computes solutions. Numerical results on real-world data substantiate the practicability of the presented approach.
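
    To make the problem concrete: risk parity asks that every asset contribute equally to portfolio risk, which can be encoded as a sum of squared pairwise differences between risk contributions. The sketch below minimizes that objective by a simple random local search with gross exposure fixed at one, so weights may go negative (long-short). It is a toy stand-in for the paper's genetic algorithm and local search heuristic.

```python
# Toy local search for long-short risk parity; a stand-in for the paper's
# algorithms, shown only to make the objective concrete.
import numpy as np

def risk_parity_objective(w, Sigma):
    """Sum of squared pairwise gaps between risk contributions
    RC_i = w_i * (Sigma w)_i; zero iff all contributions are equal."""
    rc = w * (Sigma @ w)
    return np.sum((rc[:, None] - rc[None, :]) ** 2)

def local_search(Sigma, n_iter=5000, step=0.01, seed=0):
    rng = np.random.default_rng(seed)
    n = Sigma.shape[0]
    w = np.full(n, 1.0 / n)
    best = risk_parity_objective(w, Sigma)
    for _ in range(n_iter):
        cand = w + step * rng.standard_normal(n)
        cand /= np.abs(cand).sum()          # fix gross exposure; shorts allowed
        val = risk_parity_objective(cand, Sigma)
        if val < best:
            w, best = cand, val
    return w

Sigma = np.array([[0.04, 0.01], [0.01, 0.09]])
print(local_search(Sigma))
```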

    Adaptive Evolutionary Clustering

    In many practical applications of clustering, the objects to be clustered evolve over time, and a clustering result is desired at each time step. In such applications, evolutionary clustering typically outperforms traditional static clustering by producing clustering results that reflect long-term trends while being robust to short-term variations. Several evolutionary clustering algorithms have recently been proposed, often by adding a temporal smoothness penalty to the cost function of a static clustering method. In this paper, we introduce a different approach to evolutionary clustering by accurately tracking the time-varying proximities between objects followed by static clustering. We present an evolutionary clustering framework that adaptively estimates the optimal smoothing parameter using shrinkage estimation, a statistical approach that improves a naive estimate using additional information. The proposed framework can be used to extend a variety of static clustering algorithms, including hierarchical, k-means, and spectral clustering, into evolutionary clustering algorithms. Experiments on synthetic and real data sets indicate that the proposed framework outperforms static clustering and existing evolutionary clustering algorithms in many scenarios. (Comment: To appear in Data Mining and Knowledge Discovery; MATLAB toolbox available at http://tbayes.eecs.umich.edu/xukevin/affec)
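
    The outer loop of such a framework is easy to picture: smooth the proximity matrix over time, then hand the smoothed matrix to any static clustering routine. The sketch below fixes the smoothing weight alpha, whereas the paper's contribution is to estimate it adaptively by shrinkage at each time step; SpectralClustering stands in for whichever static method is being extended.

```python
# Sketch of the evolutionary-clustering outer loop with a *fixed* smoothing
# weight; the paper estimates this weight adaptively via shrinkage.
import numpy as np
from sklearn.cluster import SpectralClustering

def evolutionary_cluster(proximities, n_clusters, alpha=0.5):
    """proximities: iterable of symmetric nonnegative similarity matrices,
    one per time step. Returns a list of label arrays, one per step."""
    labels, smoothed = [], None
    for W in proximities:
        smoothed = W if smoothed is None else alpha * smoothed + (1 - alpha) * W
        model = SpectralClustering(n_clusters=n_clusters, affinity="precomputed")
        labels.append(model.fit_predict(smoothed))
    return labels
```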

    Testing linear hypotheses in high-dimensional regressions

    For a multivariate linear model, Wilks' likelihood ratio test (LRT) constitutes one of the cornerstone tools. However, the computation of its quantiles under the null or the alternative requires complex analytic approximations and, more importantly, these distributional approximations are feasible only for a moderate dimension of the dependent variable, say $p \le 20$. On the other hand, assuming that the data dimension $p$ as well as the number $q$ of regression variables are fixed while the sample size $n$ grows, several asymptotic approximations have been proposed in the literature for Wilks' $\Lambda$, including the widely used chi-square approximation. In this paper, we consider necessary modifications to Wilks' test in a high-dimensional context, specifically assuming a high data dimension $p$ and a large sample size $n$. Based on recent random matrix theory, the correction we propose to Wilks' test is asymptotically Gaussian under the null, and simulations demonstrate that the corrected LRT has very satisfactory size and power, not only in the large-$p$, large-$n$ context but also for moderately large data dimensions like $p = 30$ or $p = 50$. As a byproduct, we give a reason explaining why the standard chi-square approximation fails for high-dimensional data. We also introduce a new procedure for the classical multiple sample significance test in MANOVA which is valid for high-dimensional data. (Comment: Accepted 02/2012 for publication in "Statistics". 20 pages, 2 figures and 2 tables.)
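
    For reference, the classical quantities being corrected look as follows: Wilks' $\Lambda$ computed from the residual and hypothesis sums-of-squares matrices, together with the Bartlett-type chi-square calibration that is exactly the approximation degrading as $p$ grows. The paper's random-matrix correction itself (an asymptotically Gaussian recentering and rescaling of $\log \Lambda$) is not reproduced in this sketch.

```python
# Classical Wilks' Lambda and its Bartlett-type chi-square calibration;
# this is the approximation the paper shows to fail for large p, not the
# paper's corrected statistic.
import numpy as np
from scipy import stats

def wilks_lambda(E, H):
    """Lambda = det(E) / det(E + H), with E the residual and H the
    hypothesis sums-of-squares-and-products matrices."""
    return np.linalg.det(E) / np.linalg.det(E + H)

def bartlett_chi2_pvalue(lmbda, p, df_hyp, df_err):
    """Bartlett's scaling: -(df_err - (p - df_hyp + 1)/2) * log(Lambda)
    is referred to a chi-square with p * df_hyp degrees of freedom."""
    stat = -(df_err - (p - df_hyp + 1) / 2.0) * np.log(lmbda)
    return stats.chi2.sf(stat, df=p * df_hyp)
```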

    Efficient template attacks

    This is the accepted manuscript version; the final published version is available from http://link.springer.com/chapter/10.1007/978-3-319-08302-5_17. Template attacks remain a powerful side-channel technique to eavesdrop on tamper-resistant hardware. They model the probability distribution of leaking signals and noise to guide a search for secret data values. In practice, several numerical obstacles can arise when implementing such attacks with multivariate normal distributions. We propose efficient methods to avoid these. We also demonstrate how to achieve significant performance improvements, both in terms of information extracted and computational cost, by pooling covariance estimates across all data values. We provide a detailed and systematic overview of many different options for implementing such attacks. Our experimental evaluation of all these methods, based on measuring the supply current of a byte-load instruction executed in an unprotected 8-bit microcontroller, leads to practical guidance for choosing an attack algorithm. Omar Choudary is a recipient of the Google Europe Fellowship in Mobile Security, and this research is supported in part by this Google Fellowship.
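
    The pooling step admits a compact description: center each data value's traces on its own template mean, then estimate a single noise covariance from all centered traces together. Below is a minimal sketch under an assumed data layout; variable names are illustrative, not from the paper's code.

```python
# Pooled covariance for template attacks: one shared noise covariance
# across all candidate data values. Data layout and names are assumptions.
import numpy as np

def pooled_covariance(traces_by_value):
    """traces_by_value: dict mapping each data value to an (n_v x m) array
    of traces. Centers per value, then pools across values."""
    centred = [t - t.mean(axis=0) for t in traces_by_value.values()]
    stacked = np.vstack(centred)
    dof = stacked.shape[0] - len(traces_by_value)   # N minus number of values
    return stacked.T @ stacked / dof

def template_loglik(trace, mean_v, pooled_precision):
    """Gaussian log-likelihood (up to an additive constant) of a trace under
    one value's template, using the shared precision matrix."""
    d = trace - mean_v
    return -0.5 * d @ pooled_precision @ d
```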

    Accounting for risk of non linear portfolios: a novel Fourier approach

    The presence of non-linear instruments is responsible for the emergence of non-Gaussian features in the price-changes distribution of realistic portfolios, even for Normally distributed risk factors. This is especially true for the benchmark Delta-Gamma-Normal model, which in general exhibits exponentially damped power-law tails. We show how knowledge of the model's characteristic function leads to Fourier representations for two standard risk measures, the Value at Risk and the Expected Shortfall, and for their sensitivities with respect to the model parameters. We detail the numerical implementation of our formulae and emphasize the reliability and efficiency of our results in comparison with Monte Carlo simulation. (Comment: 10 pages, 12 figures. Final version accepted for publication in Eur. Phys. J.)
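
    To make the Fourier route concrete: after diagonalizing, the Delta-Gamma-Normal P&L is $L = \theta + \delta^\top X + \tfrac{1}{2} \sum_j \lambda_j X_j^2$ with $X \sim N(0, I)$, whose characteristic function is known in closed form; Gil-Pelaez inversion then yields the CDF, from which a quantile (VaR) can be read off by root finding. The sketch below follows that textbook recipe with illustrative quadrature settings and toy numbers; it is not the paper's implementation.

```python
# Fourier (Gil-Pelaez) route to the Delta-Gamma-Normal CDF and quantiles:
# a textbook sketch, not the paper's implementation.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

def dgn_charfunc(t, theta, delta, lam):
    """phi(t) of L = theta + delta.X + 0.5 * sum(lam_j * X_j^2), X ~ N(0, I)."""
    z = 1.0 - 1j * t * lam
    return np.exp(1j * t * theta - 0.5 * t**2 * np.sum(delta**2 / z)) / np.prod(np.sqrt(z))

def dgn_cdf(x, theta, delta, lam):
    """Gil-Pelaez: P(L <= x) = 1/2 - (1/pi) * int_0^inf Im(phi(t) e^{-itx}) / t dt."""
    integrand = lambda t: (dgn_charfunc(t, theta, delta, lam) * np.exp(-1j * t * x)).imag / t
    val, _ = quad(integrand, 1e-8, 200.0, limit=500)
    return 0.5 - val / np.pi

# 99th percentile of the P&L for a toy two-factor book (illustrative numbers).
theta, delta, lam = 0.0, np.array([1.0, 0.5]), np.array([0.2, -0.1])
q99 = brentq(lambda x: dgn_cdf(x, theta, delta, lam) - 0.99, -10.0, 20.0)
print(q99)
```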